Tracking performance for long-lived particles at LHCb
The LHCb experiment is dedicated to the study of b- and c-hadron
decays, including long-lived particles such as K0S mesons and strange baryons
(Λ, Ξ, etc.). These kinds of particles are difficult to
reconstruct with the LHCb tracking system, since they escape detection in the
first tracker. A new method to evaluate the performance of the different
tracking algorithms for long-lived particles using real data samples has been
developed. Special emphasis is placed on particles hitting only part of the
tracking system of the new LHCb upgrade detector.
Comment: Proceedings of the Connecting the Dots and Workshop on Intelligent Trackers (CTD/WIT 2019)
Radiative b-baryon decays to measure the photon and b-baryon polarization
The radiative decays of b-baryons facilitate the direct measurement of the
photon helicity in b → sγ transitions, thus serving as an important test
of physics beyond the Standard Model. In this paper we analyze the complete
angular distribution of ground-state b-baryon radiative decays to multibody
final states, assuming an initially polarized b-baryon sample. Our sensitivity
study suggests that the photon polarization asymmetry can be extracted to good
accuracy, along with a simultaneous measurement of the initial b-baryon
polarization. With higher yields of b-baryons, achievable in subsequent runs
of the Large Hadron Collider (LHC), we find that the photon polarization
measurement can play a pivotal role in constraining different new-physics
scenarios.
Comment: Typos corrected, reference added
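The extraction described above can be illustrated with a toy, one-dimensional version of the angular fit. Assuming a simplified decay distribution dΓ/dcosθ ∝ 1 + α cosθ, where the slope α stands in for the product of photon polarization asymmetry and initial polarization (the paper analyzes the full multibody distribution; this sketch, with invented numbers, only shows the fit mechanics):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)

def sample_costheta(slope, n):
    # Rejection-sample cos(theta) from the pdf (1 + slope*cos)/2 on [-1, 1]
    out = []
    while len(out) < n:
        c = rng.uniform(-1.0, 1.0, n)
        u = rng.uniform(0.0, (1.0 + abs(slope)) / 2.0, n)
        out.extend(c[u < (1.0 + slope * c) / 2.0].tolist())
    return np.array(out[:n])

def fit_slope(data):
    # Unbinned maximum-likelihood estimate of the slope parameter
    def nll(s):
        return -np.sum(np.log((1.0 + s * data) / 2.0))
    return minimize_scalar(nll, bounds=(-0.999, 0.999), method="bounded").x

data = sample_costheta(0.30, 50_000)   # toy "true" asymmetry of 0.30
est = fit_slope(data)
print(round(est, 3))
```

With 50k events the statistical uncertainty on the slope is below the percent level, which is the sense in which higher b-baryon yields sharpen the measurement.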
A Roadmap for HEP Software and Computing R&D for the 2020s
Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments, or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.
Peer reviewed
Triggering new discoveries: development of advanced HLT1 algorithms for detection of long-lived particles at LHCb
The work presented in this thesis constitutes a significant contribution to the first high-level trigger (HLT1) of the LHCb experiment, based on the Allen project. In Allen, the entire HLT1 sequence of reconstruction algorithms has been designed to be executed on GPU cards. The work in this thesis has helped propel the project forward, enabling the LHCb trigger to successfully select events in real time at the 30 MHz collision rate during Run 3. An extensive effort has been carried out during the Allen development programme, leading to the creation of an Allen performance-portability layer which enables the framework to be executed on several architectures. Furthermore, contributions to several key algorithms inside this framework are presented. One of these algorithms, termed HybridSeeding, efficiently reconstructs the tracks produced in the SciFi detector (T-tracks). Another algorithm, named VELO-SciFi Matching, builds upon the former and allows the reconstruction of long tracks with high momentum precision. Additionally, a new algorithm named Downstream has been conceived, developed and incorporated into HLT1 for the first time. It performs a fast and efficient search for hits in the UT detector and applies a fast neural network (NN) to reject ghost tracks, allowing downstream tracks to be reconstructed with high efficiency and a low ghost rate. This is the first time that a NN has been developed for GPUs inside Allen. This new algorithm will allow the selection of long-lived particles at the HLT1 level, opening up new opportunities within both the Standard Model and its extensions. Of particular note is its role in expanding the search scope for exotic long-lived particles with lifetimes spanning from 100 ps to several nanoseconds, a domain unexplored until now by the LHCb experiment. This, in turn, enhances the sensitivity to new particles predicted by theories that include a dark sector, heavy neutral leptons, supersymmetry, or axion-like particles.
In addition, LHCb's ability to detect particles from the Standard Model, such as Λ and K, is greatly augmented, thereby enhancing the precision of analyses involving b- and c-hadron decays. The integration of the HLT1 selection lines derived from the Downstream algorithm into LHCb's real-time monitoring infrastructure will be important for data taking during Run 3 and beyond, and notably for the present alignment and calibration of the UT detector. The precision in measuring observables that are sensitive to physics beyond the Standard Model, such as rare radiative decay channels, will be greatly augmented. In this thesis a study of the measurement of the branching fraction of the signal decay relative to a normalisation channel has been performed. The analysis procedure, including selection, reconstruction and background rejection, has been described. An evaluation of the main systematic uncertainties affecting the measurement has been included. It is concluded that the statistical precision achievable in Run 3 will improve substantially as a result of the inclusion of downstream tracks. The measurement of the photon polarisation in these transitions will also benefit from the increase in yield, reaching higher precision on the polarisation parameter. Measurements of the CP asymmetry in these decays will also reach higher precision.
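The thesis abstract does not spell out the Downstream network's architecture or input features, so the following is a purely illustrative numpy sketch of NN-based ghost rejection: a tiny feed-forward classifier trained on synthetic track candidates with two hypothetical features (a hit-match chi2 and a hit count), standing in for the real GPU implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic toy candidates: ghosts tend to have larger chi2, fewer hits.
# (Hypothetical features; the actual Downstream inputs differ.)
n = 2000
ghost = rng.integers(0, 2, n)                       # 1 = ghost track
chi2 = rng.gamma(2.0, 2.0 + 3.0 * ghost)            # match quality
nhits = rng.poisson(10 - 3 * ghost)                 # hits on candidate
X = np.column_stack([chi2, nhits]).astype(float)
X = (X - X.mean(0)) / X.std(0)                      # standardize features
y = ghost.astype(float)

# One-hidden-layer MLP trained with plain full-batch gradient descent
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, 8);      b2 = 0.0
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):
    h = np.tanh(X @ W1 + b1)
    p = sig(h @ W2 + b2)
    g = (p - y) / n                                 # d(cross-entropy)/d(logit)
    W2 -= 0.5 * (h.T @ g); b2 -= 0.5 * g.sum()
    gh = np.outer(g, W2) * (1.0 - h**2)             # backprop through tanh
    W1 -= 0.5 * (X.T @ gh); b1 -= 0.5 * gh.sum(0)

h = np.tanh(X @ W1 + b1)
p = sig(h @ W2 + b2)
acc = ((p > 0.5) == (y > 0.5)).mean()
print(f"ghost-rejection accuracy: {acc:.2f}")
```

A candidate is kept when its network score is below the ghost threshold; in the real trigger the same forward pass runs per candidate on the GPU within the HLT1 throughput budget.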
Standalone track reconstruction and matching algorithms for GPU-based High level trigger at LHCb
The LHCb Upgrade in Run 3 has changed its trigger scheme to a full software selection in two steps. The first step, HLT1, will be entirely implemented on GPUs and run a fast selection aiming at reducing the visible collision rate from 30 MHz to 1 MHz. This selection relies on a partial reconstruction of the event. A version of this reconstruction starts with two monolithic tracking algorithms, the VELO-pixel tracking and the HybridSeeding on the Scintillating-Fiber tracker, which reconstruct track segments in standalone sub-detectors. Those segments are then joined through a matching algorithm in order to produce "long" tracks, which form the base of the HLT1 reconstruction. We discuss the principle of these algorithms as well as the details of their implementation which allow them to run in a high-throughput configuration. An emphasis is put on the optimization of the algorithms themselves in order to take advantage of the GPU architecture. Finally, results are presented in the context of the LHCb performance requirements for Run 3.
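The matching step can be caricatured as extrapolating VELO and SciFi segments to a common plane and pairing those that agree within a chi2 cut. The sketch below is a deliberately simplified stand-in: the plane position, resolution and cut values are invented, the geometry is one-dimensional, and the real algorithm accounts for the magnetic field and detector resolutions.

```python
import numpy as np

rng = np.random.default_rng(7)

Z_MATCH = 5200.0  # hypothetical matching plane inside the magnet (mm)

def extrapolate(seg, z):
    # seg = (x0, z0, tx): straight-line state x(z) = x0 + tx * (z - z0)
    x0, z0, tx = seg
    return x0 + tx * (z - z0)

def match(velo_segs, scifi_segs, sigma_x=2.0, chi2_max=9.0):
    """Greedily pair VELO and SciFi segments by extrapolated-x chi2."""
    pairs, used = [], set()
    for i, v in enumerate(velo_segs):
        xv = extrapolate(v, Z_MATCH)
        best, best_chi2 = None, chi2_max
        for j, s in enumerate(scifi_segs):
            if j in used:
                continue
            chi2 = ((extrapolate(s, Z_MATCH) - xv) / sigma_x) ** 2
            if chi2 < best_chi2:
                best, best_chi2 = j, chi2
        if best is not None:
            used.add(best)
            pairs.append((i, best))
    return pairs

# Toy event: 3 true tracks seen by both detectors, plus one fake SciFi segment
velo = [(0.0, 0.0, 0.01), (1.0, 0.0, -0.02), (-2.0, 0.0, 0.03)]
xs = [extrapolate(v, 7800.0) for v in velo]
scifi = [(x + rng.normal(0, 0.5), 7800.0, v[2]) for x, v in zip(xs, velo)]
scifi.append((400.0, 7800.0, 0.0))  # fake segment far from any VELO track
print(match(velo, scifi))           # → [(0, 0), (1, 1), (2, 2)]
```

The fake SciFi segment fails the chi2 cut and is dropped; on the GPU the same pairing is evaluated for many segments in parallel rather than with nested Python loops.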
Radiative B Decays at LHCb
Rare radiative B decays are sensitive probes of New Physics through the study of branching fractions, CP asymmetries and other observables related to the photon polarization. The LHCb experiment has performed several measurements with radiative B decays. These results provide constraints on the predictions of models beyond the Standard Model, and are at present key to understanding the nature of flavor physics.
Effect of the high-level trigger for detecting long-lived particles at LHCb
Long-lived particles (LLPs) show up in many extensions of the Standard Model, yet are challenging to search for with current detectors, due to their very displaced vertices. This article evaluates the ability of the trigger algorithms used in the LHCb experiment to detect long-lived particles, and presents work to adapt them in order to enhance the sensitivity of the experiment to undiscovered long-lived particles. A model with a Higgs portal to a dark sector is tested, and the sensitivity reach is discussed. In the LHCb tracking system, the farthest tracking station from the collision point is the Scintillating Fiber tracker, the SciFi detector. One of the challenges in the track reconstruction is to deal with the large amount and combinatorics of hits in this detector. A dedicated algorithm has been developed to cope with the large data output. When fully implemented, this algorithm would greatly increase the available statistics for any long-lived particle search in the forward region, and would additionally improve the sensitivity of analyses dealing with Standard Model particles of large lifetime, such as K0S or Λ hadrons.